Training a perceptron in a discrete weight space.

Authors

  • M Rosen-Zvi
  • I Kanter
Abstract

Learning in a perceptron having a discrete weight space, where each weight can take 2L+1 different values, is examined analytically and numerically. The learning algorithm is based on training a continuous perceptron and predicting with the clipped weights. The learning is described by a new set of order parameters, composed of the overlaps between the teacher and the continuous/clipped students. Different scenarios are examined, among them on-line learning with discrete and continuous transfer functions. The generalization error of the clipped weights decays asymptotically as exp(−Kα²) for on-line learning with a binary activation function and as exp(−e^(|λ|α)) for on-line learning with a continuous one, where α is the number of examples divided by N, the size of the input vector, and K is a positive constant. For finite N and L, perfect agreement between the discrete student and the teacher is obtained for α ~ L√(ln NL). A crossover to a generalization error that behaves approximately as 1/α, characteristic of continuous weights with binary output, occurs for synaptic depth L > O(√N).
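The scheme described in the abstract (train a continuous student, predict with the clipped weights, track the teacher overlaps) can be sketched roughly as follows; the error-correction update, random ±1 inputs, integer-valued teacher, and 1/√N scaling are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

# Minimal sketch: train a continuous perceptron, but evaluate with weights
# clipped onto the 2L+1 discrete levels {-L, ..., 0, ..., L}. All parameter
# choices below are illustrative assumptions.

rng = np.random.default_rng(0)
N, L, steps = 50, 3, 2000

teacher = rng.choice(np.arange(-L, L + 1), size=N).astype(float)
w = np.zeros(N)  # continuous student weights

def clip_weights(w, L):
    """Project continuous weights onto the 2L+1 integer levels in [-L, L]."""
    return np.clip(np.rint(w), -L, L)

for _ in range(steps):
    x = rng.choice([-1.0, 1.0], size=N)
    label = 1.0 if teacher @ x >= 0 else -1.0     # teacher's binary output
    if (1.0 if w @ x >= 0 else -1.0) != label:    # student errs: perceptron step
        w += label * x / np.sqrt(N)

w_clipped = clip_weights(w, L)
# Overlap order parameters between the teacher and the two students:
R_cont = teacher @ w / (np.linalg.norm(teacher) * np.linalg.norm(w))
R_clip = teacher @ w_clipped / (
    np.linalg.norm(teacher) * max(np.linalg.norm(w_clipped), 1e-12)
)
```

The two overlaps play the role of the order parameters mentioned above: the continuous overlap drives the learning dynamics, while the clipped overlap is what determines the generalization error of the discrete student.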


Similar resources

A Discrete Hybrid Teaching-Learning-Based Optimization algorithm for optimization of space trusses

In this study, to enhance the optimization process, especially in the field of structural engineering, two well-known algorithms are merged to obtain an improved hybrid algorithm. These two algorithms are Teaching-Learning-Based Optimization (TLBO) and Harmony Search (HS), both of which have been used by many researchers in varied fields of science. The hybridized algorithm is called A Di...


New full adders using multi-layer perceptron network

How to reconfigure a logic gate for a variety of functions is an interesting topic. In this paper, a different method of designing logic gates is proposed. Owing to the training ability of the multilayer perceptron neural network, it was used to create a new type of logic gate and full adder. In this method, the perceptron network was trained and then tested. This network was 100% ac...


On-line learning in a discrete state space (arXiv:cond-mat/9705257v1, 26 May 1997)

On-line learning of a rule given by an N-dimensional Ising perceptron is considered for the case when the student is constrained to take values in a discrete state space of size L^N. For L = 2, no on-line algorithm can achieve a finite overlap with the teacher in the thermodynamic limit. However, if L is on the order of √N, Hebbian learning does achieve a finite overlap. Artificial neural net...
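The claim above (Hebbian learning with a discrete student of depth L ~ √N achieves a finite overlap) can be illustrated with a rough sketch; the plain Hebbian accumulation, the ±1 Ising teacher, and the discretization onto integer levels in [-L, L] are illustrative assumptions.

```python
import numpy as np

# Rough illustration: accumulate a Hebbian field from teacher-labeled random
# inputs, then discretize it onto integer levels with depth L ~ sqrt(N) and
# measure the teacher-student overlap. All choices are illustrative.

rng = np.random.default_rng(1)
N = 100
L = int(np.sqrt(N))  # synaptic depth on the order of sqrt(N)

teacher = rng.choice([-1.0, 1.0], size=N)  # Ising teacher
h = np.zeros(N)                            # accumulated Hebbian field

for _ in range(20 * N):                    # alpha = 20 examples per weight
    x = rng.choice([-1.0, 1.0], size=N)
    label = 1.0 if teacher @ x >= 0 else -1.0
    h += label * x                         # Hebbian update: no error signal

# Discretize the field onto integer levels in [-L, L]:
student = np.rint(L * h / np.max(np.abs(h)))
overlap = teacher @ student / (np.linalg.norm(teacher) * np.linalg.norm(student))
```

For this finite N, the discretized Hebbian student ends up strongly aligned with the teacher; the thermodynamic-limit statement above concerns how this overlap behaves as N grows with L fixed or scaled as √N.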



Convergence of online gradient methods for continuous perceptrons with linearly separable training patterns

In this paper, we prove that the online gradient method for continuous perceptrons converges in finitely many steps when the training patterns are linearly separable. Neural networks have been widely used for solving supervised classification problems. In this paper, we consider the simplest feedforward neural network, the perceptron, made up of m input neurons a...
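The finite-step convergence described above can be demonstrated in the spirit of the classical perceptron convergence theorem; the data generation (random Gaussian patterns with an enforced margin from the separating hyperplane) and the unit learning rate are illustrative assumptions.

```python
import numpy as np

# Sketch: online perceptron/gradient updates on a linearly separable set
# stop after finitely many mistakes. Data and learning rate are illustrative.

rng = np.random.default_rng(2)
n, m = 5, 30
w_true = rng.normal(size=n)
w_true /= np.linalg.norm(w_true)

# Keep only patterns with a clear margin from the true hyperplane, so the
# training set is comfortably linearly separable.
X_all = rng.normal(size=(500, n))
X = X_all[np.abs(X_all @ w_true) > 0.5][:m]
y = np.where(X @ w_true >= 0, 1.0, -1.0)

w = np.zeros(n)
mistakes = 0
converged = False
for epoch in range(10_000):
    errors = 0
    for xi, yi in zip(X, y):
        if yi * (w @ xi) <= 0:   # misclassified (or exactly on the boundary)
            w += yi * xi         # online gradient step for the perceptron loss
            errors += 1
    mistakes += errors
    if errors == 0:
        converged = True         # a full error-free pass: training is done
        break
```

The total number of mistakes is bounded by (R/γ)², where R bounds the pattern norms and γ is the separation margin, which is why the enforced margin above guarantees a quick, finite stop.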



Journal:
  • Physical review. E, Statistical, nonlinear, and soft matter physics

Volume 64, Issue 4 Pt 2

Pages: -

Publication date: 2001